How Handheld Translators Work and Why They're Handy for Travel
Your cell phone can handle basic language translation, but bespoke tools can offer a much more immersive experience. Hans Christian Andersen once said, "To travel is to live," and while that's a romantic notion, he probably wasn't careening through Gyeongju, South Korea, at midnight in the back of a taxi with a driver who didn't speak a lick of English. Today's world traveler has it awfully easy when it comes to understanding the local lingo, as even a basic modern cell phone app can offer a pretty good translation of common phrases in everything from Abkhaz to Zulu. Type or speak a sentence or two into the app, tap a button, and out comes the result in the language of your choice. Tap another button, and your phone can even speak those sentences aloud.
Deep Transformers with Latent Depth
The Transformer model has achieved state-of-the-art performance in many sequence modeling tasks. However, how to leverage model capacity with large or variable depths is still an open challenge. We present a probabilistic framework to automatically learn which layer(s) to use by learning the posterior distributions of layer selection. As an extension of this framework, we propose a novel method to train one shared Transformer network for multilingual machine translation with different layer selection posteriors for each language pair. The proposed method alleviates the vanishing gradient issue and enables stable training of deep Transformers (e.g. 100 layers). We evaluate on WMT English-German machine translation and masked language modeling tasks, where our method outperforms existing approaches for training deeper Transformers. Experiments on multilingual machine translation demonstrate that this approach can effectively leverage increased model capacity and bring universal improvement for both many-to-one and one-to-many translation with diverse language pairs.
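The layer-selection idea above can be sketched in miniature: each layer carries a learned logit whose sigmoid acts as the (expected) posterior probability of using that layer, and the forward pass soft-gates each residual sublayer by it. This is a toy numpy illustration of the mechanism, not the paper's implementation — the class, its shapes, and the tanh sublayer are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LatentDepthStack:
    """Toy residual stack with a learned selection logit per layer.

    The gate g_l = sigmoid(logit_l) stands in for the posterior
    probability of selecting layer l; the forward pass soft-gates
    each residual sublayer:  h = h + g_l * f_l(h).
    (Hypothetical sketch; the paper's Transformer layers and
    variational training are not reproduced here.)
    """
    def __init__(self, num_layers, dim):
        self.weights = [rng.normal(scale=0.1, size=(dim, dim))
                        for _ in range(num_layers)]
        # One selection logit per layer. In the multilingual setting
        # there would be a separate logit vector per language pair,
        # all sharing the same layer weights.
        self.select_logits = np.zeros(num_layers)

    def forward(self, h):
        for w, logit in zip(self.weights, self.select_logits):
            g = sigmoid(logit)          # expected layer usage in [0, 1]
            h = h + g * np.tanh(h @ w)  # soft-gated residual sublayer
        return h

stack = LatentDepthStack(num_layers=4, dim=8)
out = stack.forward(rng.normal(size=(2, 8)))
print(out.shape)  # (2, 8)
```

Because a gate near zero effectively skips its layer, unused layers contribute little to the forward signal or the gradient, which is one intuition for why learned layer selection can keep very deep stacks trainable.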